2024-08-22 08:40:49 · AIbase · 11.2k
PaddlePaddle Framework 3.0 Introduces Unified Automatic Parallelism, Simplifying Large Model Training Development
PaddlePaddle Framework 3.0 centers on a major upgrade: unified automatic parallelism, aimed at simplifying distributed training for large models and improving development efficiency. The new version supports four- and five-dimensional hybrid parallelism, combining strategies such as data parallelism, tensor model parallelism, pipeline parallelism, and grouped parameter sharding to raise large-model training efficiency. Automatic parallelism uses tensor-slicing annotations to infer distributed sharding states and insert the required communication operators automatically, lowering the development burden. Its underlying principles are distributed tensor representation and sharding inference.
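To make the sharding-inference idea concrete, here is a minimal, framework-free sketch of tensor model parallelism: a weight matrix is sliced column-wise across simulated "devices", each device computes a partial matmul, and gathering the partial outputs reproduces the full result. This is what an automatic-parallel system infers and orchestrates for you (including the communication step); the function names below are illustrative, not PaddlePaddle's actual API.

```python
# Conceptual sketch of tensor model parallelism in pure Python.
# A column-sharded linear layer: each "device" holds a slice of the weight,
# computes its partial output locally, and an "all-gather" concatenates the
# slices back into the full result.

def matmul(x, w):
    """Multiply a 1-D input vector x by matrix w (list of rows)."""
    cols = len(w[0])
    return [sum(x[i] * w[i][j] for i in range(len(w))) for j in range(cols)]

def shard_columns(w, num_devices):
    """Split w column-wise into num_devices slices (one slice per device)."""
    per = len(w[0]) // num_devices
    return [[row[d * per:(d + 1) * per] for row in w] for d in range(num_devices)]

x = [1.0, 2.0]                      # input activation, replicated on all devices
w = [[1.0, 2.0, 3.0, 4.0],          # full 2x4 weight matrix
     [5.0, 6.0, 7.0, 8.0]]

shards = shard_columns(w, num_devices=2)
partials = [matmul(x, w_d) for w_d in shards]   # each device computes locally
gathered = [v for p in partials for v in p]     # "all-gather" along columns

assert gathered == matmul(x, w)     # sharded result matches the full matmul
print(gathered)
```

The point of automatic parallelism is that the developer writes only the single-device computation plus sharding annotations; the slicing, the per-device partial computation, and the communication operator are all inferred by the framework.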
2024-01-10 14:31:01 · AIbase · 4.8k
Scientists Innovate Technology to Successfully Train Trillion-Parameter Models at ChatGPT Level
Scientists have trained a ChatGPT-scale model using only 8% of a supercomputer's compute capacity. Applying innovative techniques on the Frontier supercomputer, the research team trained a trillion-parameter language model with only a few thousand AMD GPUs, achieving 100% weak scaling efficiency for both 175-billion-parameter and trillion-parameter models while using just 8% of the machine's capacity. Even so, the results show that training trillion-parameter language models remains a challenge.
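The 100% figure quoted above refers to weak scaling: the problem size grows in proportion to the device count, so a perfectly scaling system keeps the per-step time constant, and efficiency is the baseline step time divided by the step time at scale. A minimal sketch of that calculation, using made-up timing numbers purely for illustration:

```python
# Weak scaling efficiency: baseline per-step time divided by per-step time at
# scale, where the workload grows proportionally with the device count.
# 1.0 means perfect (100%) weak scaling. The timings below are hypothetical.

def weak_scaling_efficiency(t_base: float, t_scaled: float) -> float:
    """Return weak scaling efficiency as a fraction (1.0 = perfect scaling)."""
    return t_base / t_scaled

# Hypothetical per-iteration times (seconds) at small vs. large device counts:
print(weak_scaling_efficiency(12.0, 12.0))   # 1.0 -> 100%, time unchanged
print(weak_scaling_efficiency(12.0, 15.0))   # 0.8 -> 80%, overhead at scale
```

Holding efficiency at 100% while scaling to trillion-parameter models means communication and synchronization overheads were effectively hidden, which is the notable part of the result.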